1395.0 - Essential Statistical Assets for Australia, 2014  
ARCHIVED ISSUE Released at 11:30 AM (CANBERRA TIME) 12/12/2014   

LESSONS LEARNED
The quality assessment phase of ESA was the first national assessment of data quality undertaken in Australia. As this was the first iteration of the process, many lessons were learned about both the development of the ESA list and the quality assessment processes; these will be addressed in future versions of the ESA initiative.

As a result of knowledge gained from the quality assessment process, a number of revisions were made to the ESA dataset list, as outlined further in the section Revision to the 2013 ESA List and the Revised 2013 ESA List (Appendix 2). Data custodians also provided feedback that some statistics did not have a full suite of datasets identified during the development of the ESA list, or that the statistics identified did not best reflect the information required. As the ESA list was developed through an extensive consultation process in the first phase of ESA, these issues were not amended during the quality assessment phase. However, this feedback has been acknowledged, and it is intended that the ESA list will be revisited in 2016.

Assessing quality is a complex and multifaceted endeavour. While the quality assessment process was based on a transparent and consistent method developed by the ABS in partnership with data custodians, an element of subjectivity within the process is inevitable. Quality can be a nebulous concept: the quality dimensions are not mutually exclusive, and there are no definitive rules for categorising or allocating quality indicators within those dimensions. Quality is also a fit-for-purpose concept, where different uses of data and different contexts can change quality priorities and standards. Different users may hold a wide range of views about which standards and indicators are appropriate for assessing quality, and about which dimensions they fit within, even when considering data for similar purposes.

For example, one quality indicator used within the institutional environment dimension was whether organisations receiving or producing administrative data for statistical purposes provided training for staff on the statistical purpose of the administrative records they entered into their systems. This indicator was included on the basis that, if staff are trained on the statistical purpose of the administrative records, the quality of information important to the business of the organisation would not be prioritised over the quality of information important to the statistical purpose. Some users may question the importance of this indicator in comparison with other indicators; similarly, others may argue it is more relevant to the accuracy dimension than to the institutional environment dimension. No single approach satisfies every perspective on issues such as these, but the quality assessment process required concrete standards for all quality indicators, including those which are not so black and white. More detail about the quality indicators and the standards used for the ESA quality assessments at the dataset level is available in the Quality Standards for ESA Reference Guide document in the Downloads tab of this publication.

The quality standards were set for the purpose of ESA. On review of the statistic assessment results, and after feedback from data custodians, the standards were generally agreed to be acceptable for this purpose. However, some concerns were raised, for example with the accessibility and timeliness dimensions. For accessibility, the benchmark for reaching an acceptable level of quality was considered too low, and indicators for discoverability were missing. For timeliness, some statistics had components which changed at a different rate to other components, so it was suggested that a one-size-fits-all approach to the timeliness dimension was not always an appropriate measure of quality. The quality indicators and standards used for the assessment process will be reviewed for the next iteration of the ESA quality assessment process.